2024-03-27 15:04:15,969 [ 94903 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy (runner:41, check_args_and_update_paths)
2024-03-27 15:04:15,969 [ 94903 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration (runner:89, check_args_and_update_paths)
2024-03-27 15:04:15,969 [ 94903 ] INFO : src dir is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy/src (runner:96, check_args_and_update_paths)
2024-03-27 15:04:15,969 [ 94903 ] INFO : base_configs_dir: /home/ubuntu/_work/_temp/test/git-repo-copy/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration (runner:98, check_args_and_update_paths)
clickhouse_integration_tests_volume
WARNING: Ignoring custom format, because both --format and --quiet are set.
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_4pasye --privileged --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/src/Server/grpc_protos:/ClickHouse/src/Server/grpc_protos --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e XTABLES_LOCKFILE=/run/host/xtables.lock -e PYTHONUNBUFFERED=1 -e DOCKER_DOTNET_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_HELPER_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_BASE_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBERIZED_HADOOP_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBEROS_KDC_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JS_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_PHP_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=0 --color=no --durations=0 test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation test_filesystem_layout/test.py::test_file_path_escaping test_format_schema_on_server/test.py::test_protobuf_format_input test_format_schema_on_server/test.py::test_protobuf_format_output test_graphite_merge_tree_typed/test.py::test_combined_rules test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain
test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged test_graphite_merge_tree_typed/test.py::test_rollup_versions_all test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged test_graphite_merge_tree_typed/test.py::test_rules_isolation test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions test_http_and_readonly/test.py::test_http_get_is_readonly test_inherit_multiple_profiles/test.py::test_combined_profile test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success test_keeper_nodes_remove/test.py::test_nodes_remove test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load test_keeper_znode_time/test.py::test_between_servers test_keeper_znode_time/test.py::test_server_restart test_kerberos_auth/test.py::test_bad_path_to_keytab test_kerberos_auth/test.py::test_kerberos_auth_with_keytab test_kerberos_auth/test.py::test_kerberos_auth_without_keytab test_merge_tree_s3_failover/test.py::test_move_failover test_merge_tree_s3_failover/test.py::test_throttle_retry test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent test_relative_filepath/test.py::test_filepath test_reload_clusters_config/test.py::test_add_cluster test_reload_clusters_config/test.py::test_delete_cluster test_reload_clusters_config/test.py::test_simple_reload test_reload_clusters_config/test.py::test_update_one_cluster test_replica_can_become_leader/test.py::test_can_become_leader test_replica_is_active/test.py::test_replica_is_active test_restore_replica/test.py::test_restore_replica_alive_replicas test_restore_replica/test.py::test_restore_replica_invalid_tables test_restore_replica/test.py::test_restore_replica_parallel test_restore_replica/test.py::test_restore_replica_sequential test_s3_storage_class/test.py::test_s3_storage_class_right test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] test_secure_socket/test.py::test test_send_crash_reports/test.py::test_send_segfault test_server_initialization/test.py::test_live_view_dependency test_server_initialization/test.py::test_partially_dropped_tables test_server_initialization/test.py::test_sophisticated_default test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 
test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 test_storage_hdfs/test.py::test_bad_hdfs_uri test_storage_hdfs/test.py::test_cluster_join test_storage_hdfs/test.py::test_cluster_macro test_storage_hdfs/test.py::test_format_detection test_storage_hdfs/test.py::test_globs_in_read_table test_storage_hdfs/test.py::test_hdfsCluster test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards test_storage_hdfs/test.py::test_hdfs_directory_not_exist test_storage_hdfs/test.py::test_insert_select_schema_inference test_storage_hdfs/test.py::test_multiple_inserts test_storage_hdfs/test.py::test_overwrite test_storage_hdfs/test.py::test_partition_by test_storage_hdfs/test.py::test_read_files_with_spaces test_storage_hdfs/test.py::test_read_table_with_default test_storage_hdfs/test.py::test_read_write_gzip_table test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip test_storage_hdfs/test.py::test_read_write_storage test_storage_hdfs/test.py::test_read_write_storage_with_globs test_storage_hdfs/test.py::test_read_write_table test_storage_hdfs/test.py::test_read_write_table_with_parameter_none test_storage_hdfs/test.py::test_schema_inference test_storage_hdfs/test.py::test_schema_inference_cache test_storage_hdfs/test.py::test_schema_inference_with_globs test_storage_hdfs/test.py::test_seekable_formats test_storage_hdfs/test.py::test_truncate_table test_storage_hdfs/test.py::test_virtual_columns test_storage_hdfs/test.py::test_virtual_columns_2 test_storage_hdfs/test.py::test_write_gz_storage test_storage_hdfs/test.py::test_write_gzip_storage test_storage_hdfs/test.py::test_write_table test_storage_kerberized_hdfs/test.py::test_cache_path -vvv' altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d '. 
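The command above is how the CI wrapper launches pytest inside the integration-tests-runner image, passing the test selection and xdist options through PYTEST_OPTS. For illustration only (this is not part of the runner or of this log), a minimal Python sketch of an equivalent launch; the image tag and the PYTEST_OPTS flags are copied from the command above, while the trimmed volume list, timeout, and error handling are assumptions:

    # Minimal sketch, not the actual runner implementation.
    import subprocess

    IMAGE = "altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d"
    PYTEST_OPTS = "--dist=loadfile -n 10 -rfEps --run-id=0 --color=no --durations=0"

    cmd = [
        "docker", "run", "--rm", "--privileged",
        "-e", "PYTHONUNBUFFERED=1",
        "-e", f"PYTEST_OPTS={PYTEST_OPTS}",
        IMAGE,
    ]
    # Run the container, echo its output, and fail on a non-zero exit code,
    # similar in spirit to the run_and_check() helper shown later in this log.
    res = subprocess.run(cmd, capture_output=True, text=True, timeout=600)
    print(res.stdout)
    print(res.stderr)
    if res.returncode != 0:
        raise RuntimeError(f"runner exited with code {res.returncode}")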
Start tests ============================= test session starts ============================== platform linux -- Python 3.8.10, pytest-8.1.1, pluggy-1.4.0 -- /usr/bin/python3 cachedir: .pytest_cache rootdir: /ClickHouse/tests/integration configfile: pytest.ini plugins: timeout-2.3.1, repeat-0.9.3, xdist-3.5.0, random-0.2, order-1.0.0 timeout: 900.0s timeout method: signal timeout func_only: False created: 10/10 workers 10 workers [100 items] scheduling tests via LoadFileScheduling test_storage_hdfs/test.py::test_bad_hdfs_uri test_restore_replica/test.py::test_restore_replica_alive_replicas test_graphite_merge_tree_typed/test.py::test_combined_rules test_server_initialization/test.py::test_live_view_dependency test_keeper_znode_time/test.py::test_between_servers test_merge_tree_s3_failover/test.py::test_move_failover test_reload_clusters_config/test.py::test_add_cluster test_format_schema_on_server/test.py::test_protobuf_format_input test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] test_kerberos_auth/test.py::test_bad_path_to_keytab [gw8] [ 1%] PASSED test_format_schema_on_server/test.py::test_protobuf_format_input test_format_schema_on_server/test.py::test_protobuf_format_output [gw8] [ 2%] PASSED test_format_schema_on_server/test.py::test_protobuf_format_output [gw0] [ 3%] PASSED test_graphite_merge_tree_typed/test.py::test_combined_rules test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks [gw0] [ 4%] PASSED test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain [gw0] [ 5%] PASSED test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged [gw0] [ 6%] PASSED test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer [gw6] [ 7%] PASSED test_kerberos_auth/test.py::test_bad_path_to_keytab test_kerberos_auth/test.py::test_kerberos_auth_with_keytab [gw6] [ 8%] PASSED test_kerberos_auth/test.py::test_kerberos_auth_with_keytab test_kerberos_auth/test.py::test_kerberos_auth_without_keytab [gw6] [ 9%] PASSED test_kerberos_auth/test.py::test_kerberos_auth_without_keytab test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 [gw0] [ 10%] PASSED test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern [gw0] [ 11%] PASSED test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain [gw0] [ 12%] PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged [gw0] [ 13%] PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain [gw0] [ 14%] PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged [gw0] [ 15%] PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged test_graphite_merge_tree_typed/test.py::test_rollup_versions_all [gw0] [ 16%] PASSED test_graphite_merge_tree_typed/test.py::test_rollup_versions_all test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain [gw0] [ 17%] PASSED 
test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged [gw0] [ 18%] PASSED test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged test_graphite_merge_tree_typed/test.py::test_rules_isolation [gw0] [ 19%] PASSED test_graphite_merge_tree_typed/test.py::test_rules_isolation test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions [gw0] [ 20%] PASSED test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions [gw9] [ 21%] PASSED test_keeper_znode_time/test.py::test_between_servers test_keeper_znode_time/test.py::test_server_restart test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success [gw8] [ 22%] PASSED test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 [gw5] [ 23%] PASSED test_restore_replica/test.py::test_restore_replica_alive_replicas test_restore_replica/test.py::test_restore_replica_invalid_tables [gw5] [ 24%] PASSED test_restore_replica/test.py::test_restore_replica_invalid_tables test_restore_replica/test.py::test_restore_replica_parallel [gw8] [ 25%] PASSED test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 [gw9] [ 26%] PASSED test_keeper_znode_time/test.py::test_server_restart [gw7] [ 27%] PASSED test_server_initialization/test.py::test_live_view_dependency test_server_initialization/test.py::test_partially_dropped_tables [gw7] [ 28%] PASSED test_server_initialization/test.py::test_partially_dropped_tables test_server_initialization/test.py::test_sophisticated_default [gw7] [ 29%] PASSED test_server_initialization/test.py::test_sophisticated_default test_inherit_multiple_profiles/test.py::test_combined_profile test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation test_replica_can_become_leader/test.py::test_can_become_leader [gw5] [ 30%] PASSED test_restore_replica/test.py::test_restore_replica_parallel test_restore_replica/test.py::test_restore_replica_sequential [gw3] [ 31%] PASSED test_merge_tree_s3_failover/test.py::test_move_failover test_merge_tree_s3_failover/test.py::test_throttle_retry [gw3] [ 32%] PASSED test_merge_tree_s3_failover/test.py::test_throttle_retry test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] [gw8] [ 33%] PASSED test_inherit_multiple_profiles/test.py::test_combined_profile [gw5] [ 34%] PASSED test_restore_replica/test.py::test_restore_replica_sequential test_filesystem_layout/test.py::test_file_path_escaping test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent [gw0] [ 35%] PASSED test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success [gw9] [ 36%] PASSED test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation test_keeper_nodes_remove/test.py::test_nodes_remove [gw6] [ 37%] PASSED test_filesystem_layout/test.py::test_file_path_escaping [gw7] [ 38%] PASSED test_replica_can_become_leader/test.py::test_can_become_leader test_http_and_readonly/test.py::test_http_get_is_readonly [gw3] [ 39%] PASSED test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load test_relative_filepath/test.py::test_filepath [gw4] [ 40%] PASSED test_reload_clusters_config/test.py::test_add_cluster test_reload_clusters_config/test.py::test_delete_cluster 
[gw8] [ 41%] PASSED test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent [gw6] [ 42%] PASSED test_http_and_readonly/test.py::test_http_get_is_readonly [gw3] [ 43%] PASSED test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] [gw5] [ 44%] PASSED test_relative_filepath/test.py::test_filepath test_s3_storage_class/test.py::test_s3_storage_class_right test_storage_kerberized_hdfs/test.py::test_cache_path test_replica_is_active/test.py::test_replica_is_active [gw2] [ 45%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge [gw2] [ 46%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter [gw9] [ 47%] PASSED test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load test_secure_socket/test.py::test [gw4] [ 48%] PASSED test_reload_clusters_config/test.py::test_delete_cluster test_reload_clusters_config/test.py::test_simple_reload [gw8] [ 49%] PASSED test_s3_storage_class/test.py::test_s3_storage_class_right [gw0] [ 50%] PASSED test_keeper_nodes_remove/test.py::test_nodes_remove test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections [gw7] [ 51%] PASSED test_replica_is_active/test.py::test_replica_is_active [gw2] [ 52%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system [gw0] [ 53%] PASSED test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections [gw4] [ 54%] PASSED test_reload_clusters_config/test.py::test_simple_reload test_reload_clusters_config/test.py::test_update_one_cluster [gw2] [ 55%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation test_send_crash_reports/test.py::test_send_segfault [gw9] [ 56%] PASSED test_secure_socket/test.py::test [gw7] [ 57%] PASSED test_send_crash_reports/test.py::test_send_segfault [gw2] [ 58%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter [gw4] [ 59%] PASSED test_reload_clusters_config/test.py::test_update_one_cluster [gw2] [ 60%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system [gw5] [ 61%] PASSED test_storage_kerberized_hdfs/test.py::test_cache_path [gw2] [ 62%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] [gw2] [ 63%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] [gw2] [ 64%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] [gw2] [ 65%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] [gw2] [ 66%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] 
test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] [gw2] [ 67%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] [gw2] [ 68%] PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] [gw1] [ 69%] ERROR test_storage_hdfs/test.py::test_bad_hdfs_uri test_storage_hdfs/test.py::test_cluster_join [gw1] [ 70%] ERROR test_storage_hdfs/test.py::test_cluster_join test_storage_hdfs/test.py::test_cluster_macro [gw1] [ 71%] ERROR test_storage_hdfs/test.py::test_cluster_macro test_storage_hdfs/test.py::test_format_detection [gw1] [ 72%] ERROR test_storage_hdfs/test.py::test_format_detection test_storage_hdfs/test.py::test_globs_in_read_table [gw1] [ 73%] ERROR test_storage_hdfs/test.py::test_globs_in_read_table test_storage_hdfs/test.py::test_hdfsCluster [gw1] [ 74%] ERROR test_storage_hdfs/test.py::test_hdfsCluster test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards [gw1] [ 75%] ERROR test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards [gw1] [ 76%] ERROR test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards test_storage_hdfs/test.py::test_hdfs_directory_not_exist [gw1] [ 77%] ERROR test_storage_hdfs/test.py::test_hdfs_directory_not_exist test_storage_hdfs/test.py::test_insert_select_schema_inference [gw1] [ 78%] ERROR test_storage_hdfs/test.py::test_insert_select_schema_inference test_storage_hdfs/test.py::test_multiple_inserts [gw1] [ 79%] ERROR test_storage_hdfs/test.py::test_multiple_inserts test_storage_hdfs/test.py::test_overwrite [gw1] [ 80%] ERROR test_storage_hdfs/test.py::test_overwrite test_storage_hdfs/test.py::test_partition_by [gw1] [ 81%] ERROR test_storage_hdfs/test.py::test_partition_by test_storage_hdfs/test.py::test_read_files_with_spaces [gw1] [ 82%] ERROR test_storage_hdfs/test.py::test_read_files_with_spaces test_storage_hdfs/test.py::test_read_table_with_default [gw1] [ 83%] ERROR test_storage_hdfs/test.py::test_read_table_with_default test_storage_hdfs/test.py::test_read_write_gzip_table [gw1] [ 84%] ERROR test_storage_hdfs/test.py::test_read_write_gzip_table test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz [gw1] [ 85%] ERROR test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip [gw1] [ 86%] ERROR test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip test_storage_hdfs/test.py::test_read_write_storage [gw1] [ 87%] ERROR test_storage_hdfs/test.py::test_read_write_storage test_storage_hdfs/test.py::test_read_write_storage_with_globs [gw1] [ 88%] ERROR test_storage_hdfs/test.py::test_read_write_storage_with_globs test_storage_hdfs/test.py::test_read_write_table [gw1] [ 89%] ERROR test_storage_hdfs/test.py::test_read_write_table test_storage_hdfs/test.py::test_read_write_table_with_parameter_none [gw1] [ 90%] ERROR test_storage_hdfs/test.py::test_read_write_table_with_parameter_none test_storage_hdfs/test.py::test_schema_inference [gw1] [ 91%] ERROR test_storage_hdfs/test.py::test_schema_inference test_storage_hdfs/test.py::test_schema_inference_cache [gw1] [ 92%] ERROR test_storage_hdfs/test.py::test_schema_inference_cache test_storage_hdfs/test.py::test_schema_inference_with_globs [gw1] [ 93%] ERROR 
test_storage_hdfs/test.py::test_schema_inference_with_globs
test_storage_hdfs/test.py::test_seekable_formats
[gw1] [ 94%] ERROR test_storage_hdfs/test.py::test_seekable_formats
test_storage_hdfs/test.py::test_truncate_table
[gw1] [ 95%] ERROR test_storage_hdfs/test.py::test_truncate_table
test_storage_hdfs/test.py::test_virtual_columns
[gw1] [ 96%] ERROR test_storage_hdfs/test.py::test_virtual_columns
test_storage_hdfs/test.py::test_virtual_columns_2
[gw1] [ 97%] ERROR test_storage_hdfs/test.py::test_virtual_columns_2
test_storage_hdfs/test.py::test_write_gz_storage
[gw1] [ 98%] ERROR test_storage_hdfs/test.py::test_write_gz_storage
test_storage_hdfs/test.py::test_write_gzip_storage
[gw1] [ 99%] ERROR test_storage_hdfs/test.py::test_write_gzip_storage
test_storage_hdfs/test.py::test_write_table
[gw1] [100%] ERROR test_storage_hdfs/test.py::test_write_table
==================================== ERRORS ====================================
_____________________ ERROR at setup of test_bad_hdfs_uri ______________________
[gw1] linux -- Python 3.8.10 /usr/bin/python3

    @pytest.fixture(scope="module")
    def started_cluster():
        try:
>           cluster.start()

test_storage_hdfs/test.py:24:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
helpers/cluster.py:2548: in start
    raise ex
helpers/cluster.py:2544: in start
    run_and_check(images_pull_cmd)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...]
env = None, shell = False, stdout = -1, stderr = -1, timeout = 300
nothrow = False, detach = False

    def run_and_check(
        args,
        env=None,
        shell=False,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        timeout=300,
        nothrow=False,
        detach=False,
    ):
        if detach:
            subprocess.Popen(
                args,
                stdout=subprocess.DEVNULL,
                stderr=subprocess.DEVNULL,
                env=env,
                shell=shell,
            )
            return

        logging.debug(f"Command:{args}")
        res = subprocess.run(
            args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout
        )
        out = res.stdout.decode("utf-8")
        err = res.stderr.decode("utf-8")
        # check_call(...) from subprocess does not print stderr, so we do it manually
        for outline in out.splitlines():
            logging.debug(f"Stdout:{outline}")
        for errline in err.splitlines():
            logging.debug(f"Stderr:{errline}")
        if res.returncode != 0:
            logging.debug(f"Exitcode:{res.returncode}")
            if env:
                logging.debug(f"Env:{env}")
            if not nothrow:
>               raise Exception(
                    f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}"
                )
E               Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ...
E               Pulling hdfs1 ...
E               Pulling node1 ... pulling from altinityinfra/integr...
E               Pulling node1 ... digest: sha256:0a374a389fa493e61c...
E               Pulling node1 ... status: image is up to date for a...
E               Pulling node1 ... done
E               Pulling hdfs1 ... pulling from sequenceiq/hadoop-do...
E
E               ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release.
Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ---------------------------- Captured stdout setup ----------------------------- Copy common default production configuration from /clickhouse-config. Files: config.xml, users.xml ---------------------------- Captured stderr setup ----------------------------- Command:['docker ps | wc -l'] Command:['docker ps | wc -l'] Stdout:1 Stdout:1 No running containers No running containers Running tests in /ClickHouse/tests/integration/test_storage_hdfs/test.py Running tests in /ClickHouse/tests/integration/test_storage_hdfs/test.py Cluster start called. is_up=False Cluster start called. is_up=False Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME Cleanup called Cleanup called Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}' Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}' Unstopped containers: {} Unstopped containers: {} No running containers for project: rootteststoragehdfs No running containers for project: rootteststoragehdfs Trying to prune unused networks... Trying to prune unused networks... Trying to prune unused images... Trying to prune unused images... Command:['docker', 'image', 'prune', '-f'] Command:['docker', 'image', 'prune', '-f'] Stderr:Error response from daemon: a prune operation is already running Stderr:Error response from daemon: a prune operation is already running Exitcode:1 Exitcode:1 Trying to prune unused volumes... Trying to prune unused volumes... 
Command:['docker volume ls | wc -l'] Command:['docker volume ls | wc -l'] Stdout:1 Stdout:1 Setup directory for instance: node1 Setup directory for instance: node1 Create directory for configuration generated in this helper Create directory for configuration generated in this helper Create directory for common tests configuration Create directory for common tests configuration Copy common configuration from helpers Copy common configuration from helpers Generate and write macros file Generate and write macros file Copy custom test config files ['/ClickHouse/tests/integration/test_storage_hdfs/configs/macro.xml', '/ClickHouse/tests/integration/test_storage_hdfs/configs/schema_cache.xml', '/ClickHouse/tests/integration/test_storage_hdfs/configs/cluster.xml'] to /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/configs/config.d Copy custom test config files ['/ClickHouse/tests/integration/test_storage_hdfs/configs/macro.xml', '/ClickHouse/tests/integration/test_storage_hdfs/configs/schema_cache.xml', '/ClickHouse/tests/integration/test_storage_hdfs/configs/cluster.xml'] to /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/configs/config.d Setup database dir /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/database Setup database dir /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/database Setup logs dir /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/logs Setup logs dir /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/logs Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log"] Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log"] Env {'TSAN_OPTIONS': 'second_deadlock_stack=1', 'ASAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw', 'HDFS_HOST': 'hdfs1', 'HDFS_NAME_PORT': '50070', 'HDFS_DATA_PORT': '50075', 'HDFS_LOGS': '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/hdfs/logs', 'HDFS_FS': 'bind'} stored in /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env Env {'TSAN_OPTIONS': 'second_deadlock_stack=1', 'ASAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw', 'HDFS_HOST': 'hdfs1', 'HDFS_NAME_PORT': '50070', 'HDFS_DATA_PORT': '50075', 'HDFS_LOGS': '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/hdfs/logs', 'HDFS_FS': 'bind'} stored in /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] No config file found No config file found Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] No config file found No config file found http://localhost:None "GET /version HTTP/1.1" 200 824 http://localhost:None "GET /version HTTP/1.1" 200 824 Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 
'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Stderr:Pulling hdfs1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling node1 ... Stderr:Pulling node1 ... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... done Stderr:Pulling node1 ... done Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr: Stderr: Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Exitcode:1 Exitcode:1 Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... 
ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Stderr:Pulling node1 ... Stderr:Pulling node1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... done Stderr:Pulling node1 ... done Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... 
Stderr: Stderr: Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Exitcode:1 Exitcode:1 Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... Pulling hdfs1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... Pulling hdfs1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... 
status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Stderr:Pulling hdfs1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling node1 ... Stderr:Pulling node1 ... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... done Stderr:Pulling node1 ... done Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr: Stderr: Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. 
Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Exitcode:1 Exitcode:1 Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. 
More information at https://docs.docker.com/go/deprecated-image-specs/ Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Stderr:Pulling hdfs1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling node1 ... Stderr:Pulling node1 ... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... done Stderr:Pulling node1 ... done Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr: Stderr: Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Exitcode:1 Exitcode:1 Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... 
pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] Stderr:Pulling node1 ... Stderr:Pulling node1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling hdfs1 ... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... pulling from altinityinfra/integr... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... status: image is up to date for a... Stderr:Pulling node1 ... done Stderr:Pulling node1 ... done Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... 
Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull']
Stderr:Pulling node1 ...
Stderr:Pulling hdfs1 ...
Stderr:Pulling node1 ... pulling from altinityinfra/integr...
Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c...
Stderr:Pulling node1 ... status: image is up to date for a...
Stderr:Pulling node1 ... done
Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do...
Stderr:
Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
Exitcode:1
Failed to start cluster:
Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... Pulling hdfs1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
Traceback (most recent call last): File "/ClickHouse/tests/integration/helpers/cluster.py", line 2548, in start raise ex File "/ClickHouse/tests/integration/helpers/cluster.py", line 2544, in start run_and_check(images_pull_cmd) File "/ClickHouse/tests/integration/helpers/cluster.py", line 113, in run_and_check raise Exception( Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... Pulling hdfs1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/
None
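Because the pull never succeeds, cluster.start() exhausts its retries, shuts the cluster back down, and the tests in test_storage_hdfs are reported below as ERRORs at fixture setup rather than as test failures. One possible mitigation, shown here purely as a hypothetical sketch and not taken from the repository, is a module-level guard that probes the legacy image once and skips the whole module when the engine refuses schema 1 images:

```python
# Hypothetical guard for test_storage_hdfs/test.py (illustration only, not the
# repository's actual code): probe the legacy hdfs1 image once and skip the module
# instead of erroring out in the started_cluster fixture for every test.
import subprocess

import pytest

LEGACY_IMAGE = "docker.io/sequenceiq/hadoop-docker:2.7.0"


def _can_pull_legacy_image() -> bool:
    # `docker pull` exits non-zero with the same DEPRECATION NOTICE seen above
    # when the engine has schema 1 support disabled.
    res = subprocess.run(
        ["docker", "pull", LEGACY_IMAGE],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        timeout=300,
    )
    return res.returncode == 0


# Runs once at collection time; every test in the module inherits the mark.
pytestmark = pytest.mark.skipif(
    not _can_pull_legacy_image(),
    reason="Docker engine refuses schema 1 images; hdfs1 cannot be pulled",
)
```

If the deprecation page linked in the notice documents a temporary daemon-side opt-in for schema 1 pulls, that could unblock CI in the short term, but the durable fix is to point the hdfs1 service at an image published with a current manifest format.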
docker-compose up was not called. Trying to export docker.log for running containers
Cleanup called
Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE
Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME
Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}'
Unstopped containers: {}
No running containers for project: rootteststoragehdfs
Trying to prune unused networks...
Trying to prune unused images...
Command:['docker', 'image', 'prune', '-f']
Stdout:Total reclaimed space: 0B
Images pruned
Trying to prune unused volumes...
Command:['docker volume ls | wc -l']
Stdout:1
docker-compose up was not called. Trying to export docker.log for running containers
Cleanup called
Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE
Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME
Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}'
Unstopped containers: {}
No running containers for project: rootteststoragehdfs
Trying to prune unused networks...
Trying to prune unused images...
Command:['docker', 'image', 'prune', '-f']
Stdout:Total reclaimed space: 0B
Images pruned
Trying to prune unused volumes...
Command:['docker volume ls | wc -l'] Command:['docker volume ls | wc -l'] Stdout:1 Stdout:1 ------------------------------ Captured log setup ------------------------------ 2024-03-27 15:04:20 [ 366 ] DEBUG : Command:['docker ps | wc -l'] (cluster.py:97, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : No running containers (conftest.py:44, cleanup_environment) 2024-03-27 15:04:20 [ 366 ] INFO : Running tests in /ClickHouse/tests/integration/test_storage_hdfs/test.py (cluster.py:2508, start) 2024-03-27 15:04:20 [ 366 ] DEBUG : Cluster start called. is_up=False (cluster.py:2515, start) 2024-03-27 15:04:20 [ 366 ] DEBUG : Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces) 2024-03-27 15:04:20 [ 366 ] DEBUG : Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces) 2024-03-27 15:04:20 [ 366 ] DEBUG : Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces) 2024-03-27 15:04:20 [ 366 ] DEBUG : Cleanup called (cluster.py:654, cleanup) 2024-03-27 15:04:20 [ 366 ] DEBUG : Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces) 2024-03-27 15:04:20 [ 366 ] DEBUG : Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces) 2024-03-27 15:04:20 [ 366 ] DEBUG : Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces) 2024-03-27 15:04:20 [ 366 ] DEBUG : Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}' (cluster.py:97, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Unstopped containers: {} (cluster.py:668, cleanup) 2024-03-27 15:04:20 [ 366 ] DEBUG : No running containers for project: rootteststoragehdfs (cluster.py:682, cleanup) 2024-03-27 15:04:20 [ 366 ] DEBUG : Trying to prune unused networks... (cluster.py:688, cleanup) 2024-03-27 15:04:20 [ 366 ] DEBUG : Trying to prune unused images... (cluster.py:704, cleanup) 2024-03-27 15:04:20 [ 366 ] DEBUG : Command:['docker', 'image', 'prune', '-f'] (cluster.py:97, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Stderr:Error response from daemon: a prune operation is already running (cluster.py:107, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Exitcode:1 (cluster.py:109, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:713, cleanup) 2024-03-27 15:04:20 [ 366 ] DEBUG : Command:['docker volume ls | wc -l'] (cluster.py:97, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check) 2024-03-27 15:04:20 [ 366 ] DEBUG : Setup directory for instance: node1 (cluster.py:2528, start) 2024-03-27 15:04:20 [ 366 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4146, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Create directory for common tests configuration (cluster.py:4151, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Copy common configuration from helpers (cluster.py:4171, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Generate and write macros file (cluster.py:4184, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_storage_hdfs/configs/macro.xml', '/ClickHouse/tests/integration/test_storage_hdfs/configs/schema_cache.xml', '/ClickHouse/tests/integration/test_storage_hdfs/configs/cluster.xml'] to /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/configs/config.d (cluster.py:4215, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/database (cluster.py:4232, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/logs (cluster.py:4243, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log"] (cluster.py:4319, create_dir) 2024-03-27 15:04:20 [ 366 ] DEBUG : Env {'TSAN_OPTIONS': 'second_deadlock_stack=1', 'ASAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw', 'HDFS_HOST': 'hdfs1', 'HDFS_NAME_PORT': '50070', 'HDFS_DATA_PORT': '50075', 'HDFS_LOGS': '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/hdfs/logs', 'HDFS_FS': 'bind'} stored in /ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env (cluster.py:70, _create_env_file) 2024-03-27 15:04:20 [ 366 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2024-03-27 15:04:20 [ 366 ] DEBUG : No config file found (config.py:28, find_config_file) 2024-03-27 15:04:20 [ 366 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2024-03-27 15:04:20 [ 366 ] DEBUG : No config file found (config.py:28, find_config_file) 2024-03-27 15:04:20 [ 366 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 824 (connectionpool.py:429, _make_request) 2024-03-27 15:04:20 [ 366 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] (cluster.py:97, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling node1 ... (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling node1 ... pulling from altinityinfra/integr... 
(cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling node1 ... status: image is up to date for a... (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling node1 ... done (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr: (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:05:14 [ 366 ] DEBUG : Exitcode:1 (cluster.py:109, run_and_check) 2024-03-27 15:05:14 [ 366 ] INFO : Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:2549, start) 2024-03-27 15:05:14 [ 366 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] (cluster.py:97, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling node1 ... 
(cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling node1 ... pulling from altinityinfra/integr... (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling node1 ... status: image is up to date for a... (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling node1 ... done (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr: (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:06:07 [ 366 ] DEBUG : Exitcode:1 (cluster.py:109, run_and_check) 2024-03-27 15:06:07 [ 366 ] INFO : Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... Pulling hdfs1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. 
More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:2549, start) 2024-03-27 15:06:10 [ 366 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] (cluster.py:97, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... pulling from altinityinfra/integr... (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... status: image is up to date for a... (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... done (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr: (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:07:04 [ 366 ] DEBUG : Exitcode:1 (cluster.py:109, run_and_check) 2024-03-27 15:07:04 [ 366 ] INFO : Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. 
More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:2549, start) 2024-03-27 15:07:10 [ 366 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] (cluster.py:97, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... pulling from altinityinfra/integr... (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... status: image is up to date for a... (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling node1 ... done (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr: (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:08:04 [ 366 ] DEBUG : Exitcode:1 (cluster.py:109, run_and_check) 2024-03-27 15:08:04 [ 366 ] INFO : Got exception pulling images: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling hdfs1 ... Pulling node1 ... Pulling node1 ... pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... 
ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:2549, start) 2024-03-27 15:08:13 [ 366 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] (cluster.py:97, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling node1 ... (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling node1 ... pulling from altinityinfra/integr... (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling node1 ... digest: sha256:0a374a389fa493e61c... (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling node1 ... status: image is up to date for a... (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling node1 ... done (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr: (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Stderr:[DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:107, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Exitcode:1 (cluster.py:109, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Failed to start cluster: (cluster.py:2880, start) 2024-03-27 15:09:06 [ 366 ] DEBUG : Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... Pulling hdfs1 ... Pulling node1 ... 
pulling from altinityinfra/integr... Pulling node1 ... digest: sha256:0a374a389fa493e61c... Pulling node1 ... status: image is up to date for a... Pulling node1 ... done Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ (cluster.py:2881, start) 2024-03-27 15:09:06 [ 366 ] DEBUG : None (cluster.py:2882, start) 2024-03-27 15:09:06 [ 366 ] WARNING : docker-compose up was not called. Trying to export docker.log for running containers (cluster.py:2948, shutdown) 2024-03-27 15:09:06 [ 366 ] DEBUG : Cleanup called (cluster.py:654, cleanup) 2024-03-27 15:09:06 [ 366 ] DEBUG : Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces) 2024-03-27 15:09:06 [ 366 ] DEBUG : Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces) 2024-03-27 15:09:06 [ 366 ] DEBUG : Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces) 2024-03-27 15:09:06 [ 366 ] DEBUG : Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}' (cluster.py:97, run_and_check) 2024-03-27 15:09:06 [ 366 ] DEBUG : Unstopped containers: {} (cluster.py:668, cleanup) 2024-03-27 15:09:06 [ 366 ] DEBUG : No running containers for project: rootteststoragehdfs (cluster.py:682, cleanup) 2024-03-27 15:09:06 [ 366 ] DEBUG : Trying to prune unused networks... (cluster.py:688, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Trying to prune unused images... (cluster.py:704, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Command:['docker', 'image', 'prune', '-f'] (cluster.py:97, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:105, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Images pruned (cluster.py:707, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Trying to prune unused volumes... (cluster.py:713, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Command:['docker volume ls | wc -l'] (cluster.py:97, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check) 2024-03-27 15:09:07 [ 366 ] WARNING : docker-compose up was not called. 
Trying to export docker.log for running containers (cluster.py:2948, shutdown) 2024-03-27 15:09:07 [ 366 ] DEBUG : Cleanup called (cluster.py:654, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Docker networks for project rootteststoragehdfs are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces) 2024-03-27 15:09:07 [ 366 ] DEBUG : Docker containers for project rootteststoragehdfs are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces) 2024-03-27 15:09:07 [ 366 ] DEBUG : Docker volumes for project rootteststoragehdfs are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces) 2024-03-27 15:09:07 [ 366 ] DEBUG : Command:docker container list --all --filter name='^/rootteststoragehdfs_.*_1$' --format '{{.ID}}:{{.Names}}' (cluster.py:97, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Unstopped containers: {} (cluster.py:668, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : No running containers for project: rootteststoragehdfs (cluster.py:682, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Trying to prune unused networks... (cluster.py:688, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Trying to prune unused images... (cluster.py:704, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Command:['docker', 'image', 'prune', '-f'] (cluster.py:97, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:105, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Images pruned (cluster.py:707, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Trying to prune unused volumes... (cluster.py:713, cleanup) 2024-03-27 15:09:07 [ 366 ] DEBUG : Command:['docker volume ls | wc -l'] (cluster.py:97, run_and_check) 2024-03-27 15:09:07 [ 366 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check) _____________________ ERROR at setup of test_cluster_join ______________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _____________________ ERROR at setup of test_cluster_macro _____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ___________________ ERROR at setup of test_format_detection ____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception __________________ ERROR at setup of test_globs_in_read_table __________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ______________________ ERROR at setup of test_hdfsCluster ______________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception __________ ERROR at setup of test_hdfsCluster_skip_unavailable_shards __________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _________ ERROR at setup of test_hdfsCluster_unskip_unavailable_shards _________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _______________ ERROR at setup of test_hdfs_directory_not_exist ________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ____________ ERROR at setup of test_insert_select_schema_inference _____________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ___________________ ERROR at setup of test_multiple_inserts ____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _______________________ ERROR at setup of test_overwrite _______________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _____________________ ERROR at setup of test_partition_by ______________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ________________ ERROR at setup of test_read_files_with_spaces _________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ________________ ERROR at setup of test_read_table_with_default ________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _________________ ERROR at setup of test_read_write_gzip_table _________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _____ ERROR at setup of test_read_write_gzip_table_with_parameter_auto_gz ______ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _______ ERROR at setup of test_read_write_gzip_table_with_parameter_gzip _______ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception __________________ ERROR at setup of test_read_write_storage ___________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _____________ ERROR at setup of test_read_write_storage_with_globs _____________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ___________________ ERROR at setup of test_read_write_table ____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception _________ ERROR at setup of test_read_write_table_with_parameter_none __________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ___________________ ERROR at setup of test_schema_inference ____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ________________ ERROR at setup of test_schema_inference_cache _________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ______________ ERROR at setup of test_schema_inference_with_globs ______________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ___________________ ERROR at setup of test_seekable_formats ____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug(f"Stdout:{outline}") for errline in err.splitlines(): logging.debug(f"Stderr:{errline}") if res.returncode != 0: logging.debug(f"Exitcode:{res.returncode}") if env: logging.debug(f"Env:{env}") if not nothrow: > raise Exception( f"Command {args} return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... E Pulling hdfs1 ... E Pulling node1 ... pulling from altinityinfra/integr... E Pulling node1 ... digest: sha256:0a374a389fa493e61c... E Pulling node1 ... status: image is up to date for a... E Pulling node1 ... done E Pulling hdfs1 ... pulling from sequenceiq/hadoop-do... E E ERROR: for hdfs1 [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ E [DEPRECATION NOTICE] Docker Image Format v1 and Docker Image manifest version 2, schema 1 support is disabled by default and will be removed in an upcoming release. Suggest the author of docker.io/sequenceiq/hadoop-docker:2.7.0 to upgrade the image to the OCI Format or Docker Image manifest v2, schema 2. More information at https://docs.docker.com/go/deprecated-image-specs/ helpers/cluster.py:113: Exception ____________________ ERROR at setup of test_truncate_table _____________________ [gw1] linux -- Python 3.8.10 /usr/bin/python3 @pytest.fixture(scope="module") def started_cluster(): try: > cluster.start() test_storage_hdfs/test.py:24: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ helpers/cluster.py:2548: in start raise ex helpers/cluster.py:2544: in start run_and_check(images_pull_cmd) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', ...] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args, env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ): if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return logging.debug(f"Command:{args}") res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout ) out = res.stdout.decode("utf-8") err = res.stderr.decode("utf-8") # check_call(...) 
____________________ ERROR at setup of test_truncate_table _____________________
____________________ ERROR at setup of test_virtual_columns ____________________
___________________ ERROR at setup of test_virtual_columns_2 ___________________
___________________ ERROR at setup of test_write_gz_storage ____________________
__________________ ERROR at setup of test_write_gzip_storage ___________________
______________________ ERROR at setup of test_write_table ______________________
[gw1] linux -- Python 3.8.10 /usr/bin/python3
(the started_cluster fixture setup for each of these tests fails with the identical docker-compose 'pull' traceback shown above for test_seekable_formats, ending in helpers/cluster.py:113: Exception)
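The failing image, docker.io/sequenceiq/hadoop-docker:2.7.0, is still published as a Docker Image manifest v2 schema 1 image, which current Docker Engine releases refuse to pull by default; that is what the deprecation notice in the traceback reports. A hedged pre-flight sketch is below: it only checks whether the image is already in the local cache (in which case docker-compose would not need to pull it), and the helper name and error text are illustrative, not part of helpers/cluster.py.

    # Pre-flight sketch (illustrative only): fail fast with a clear message when the
    # schema-1 hdfs image is neither cached locally nor pullable on this engine.
    import subprocess

    HDFS_IMAGE = "sequenceiq/hadoop-docker:2.7.0"  # image named in the deprecation notice

    def image_available_locally(image: str) -> bool:
        # `docker image inspect` exits 0 only when the image is already in the local cache.
        res = subprocess.run(
            ["docker", "image", "inspect", image],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return res.returncode == 0

    if not image_available_locally(HDFS_IMAGE):
        raise RuntimeError(
            f"{HDFS_IMAGE} is not cached locally, and engines that disable manifest "
            "v2 schema 1 support will refuse to pull it; pre-load the image, mirror it "
            "in an OCI/schema-2 format, or see https://docs.docker.com/go/deprecated-image-specs/ "
            "for the documented daemon-side opt-out."
        )

One longer-term fix suggested by the notice itself is to republish the image in OCI or Docker Image manifest v2, schema 2 format so the deprecated pull path is not needed at all.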
=============================== warnings summary ===============================
test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3]
  /usr/local/lib/python3.8/dist-packages/_pytest/threadexception.py:77: PytestUnhandledThreadExceptionWarning: Exception in thread Thread-9

  Traceback (most recent call last):
    File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
      self.run()
    File "/usr/lib/python3.8/threading.py", line 870, in run
      self._target(*self._args, **self._kwargs)
    File "/usr/local/lib/python3.8/dist-packages/kazoo/protocol/connection.py", line 544, in zk_loop
      if retry(self._connect_loop, retry) is STOP_CONNECTING:
    File "/usr/local/lib/python3.8/dist-packages/kazoo/retry.py", line 132, in __call__
      return func(*args, **kwargs)
    File "/usr/local/lib/python3.8/dist-packages/kazoo/protocol/connection.py", line 587, in _connect_loop
      status = self._connect_attempt(host, hostip, port, retry)
    File "/usr/local/lib/python3.8/dist-packages/kazoo/protocol/connection.py", line 640, in _connect_attempt
      response = self._read_socket(read_timeout)
    File "/usr/local/lib/python3.8/dist-packages/kazoo/protocol/connection.py", line 485, in _read_socket
      return self._read_response(header, buffer, offset)
    File "/usr/local/lib/python3.8/dist-packages/kazoo/protocol/connection.py", line 400, in _read_response
      request, async_object, xid = client._pending.popleft()
  IndexError: pop from an empty deque

    warnings.warn(pytest.PytestUnhandledThreadExceptionWarning(msg))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================== slowest durations ===============================
286.66s setup
test_storage_hdfs/test.py::test_bad_hdfs_uri 79.75s setup test_storage_kerberized_hdfs/test.py::test_cache_path 43.88s call test_keeper_nodes_remove/test.py::test_nodes_remove 43.26s teardown test_storage_kerberized_hdfs/test.py::test_cache_path 40.95s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] 38.91s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] 32.85s call test_reload_clusters_config/test.py::test_add_cluster 32.78s call test_reload_clusters_config/test.py::test_delete_cluster 32.77s call test_reload_clusters_config/test.py::test_update_one_cluster 28.41s setup test_merge_tree_s3_failover/test.py::test_move_failover 27.95s setup test_server_initialization/test.py::test_live_view_dependency 27.78s setup test_reload_clusters_config/test.py::test_add_cluster 27.07s setup test_restore_replica/test.py::test_restore_replica_alive_replicas 26.71s setup test_s3_storage_class/test.py::test_s3_storage_class_right 26.64s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation 24.08s teardown test_kerberos_auth/test.py::test_kerberos_auth_without_keytab 23.03s call test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load 22.50s teardown test_restore_replica/test.py::test_restore_replica_sequential 22.03s teardown test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] 21.97s setup test_replica_is_active/test.py::test_replica_is_active 21.88s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] 21.83s teardown test_s3_storage_class/test.py::test_s3_storage_class_right 19.33s call test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] 19.11s teardown test_reload_clusters_config/test.py::test_update_one_cluster 18.48s teardown test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation 18.45s teardown test_replica_can_become_leader/test.py::test_can_become_leader 18.25s teardown test_replica_is_active/test.py::test_replica_is_active 17.87s setup test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent 17.84s setup test_replica_can_become_leader/test.py::test_can_become_leader 17.58s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] 17.20s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] 16.99s call test_secure_socket/test.py::test 16.77s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] 16.39s call test_reload_clusters_config/test.py::test_simple_reload 15.05s setup test_kerberos_auth/test.py::test_bad_path_to_keytab 14.80s setup test_keeper_znode_time/test.py::test_between_servers 12.90s setup test_graphite_merge_tree_typed/test.py::test_combined_rules 12.77s setup test_format_schema_on_server/test.py::test_protobuf_format_input 12.76s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter 12.66s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system 12.64s setup test_secure_socket/test.py::test 12.33s setup test_http_and_readonly/test.py::test_http_get_is_readonly 12.23s setup test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success 11.89s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] 11.75s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] 11.69s call 
test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] 10.11s call test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] 10.08s setup test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections 9.79s call test_keeper_znode_time/test.py::test_server_restart 8.29s setup test_filesystem_layout/test.py::test_file_path_escaping 7.38s setup test_keeper_nodes_remove/test.py::test_nodes_remove 7.30s teardown test_secure_socket/test.py::test 7.08s setup test_send_crash_reports/test.py::test_send_segfault 6.92s setup test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 6.90s call test_keeper_znode_time/test.py::test_between_servers 6.59s setup test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load 6.05s call test_merge_tree_s3_failover/test.py::test_move_failover 6.00s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system 5.78s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge 5.77s call test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter 5.58s setup test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation 4.89s call test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent 4.39s call test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 4.21s setup test_relative_filepath/test.py::test_filepath 4.08s call test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success 4.07s call test_replica_is_active/test.py::test_replica_is_active 3.94s call test_restore_replica/test.py::test_restore_replica_sequential 3.84s call test_server_initialization/test.py::test_live_view_dependency 3.81s setup test_inherit_multiple_profiles/test.py::test_combined_profile 3.76s call test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 3.70s teardown test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent 3.63s teardown test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions 3.48s call test_restore_replica/test.py::test_restore_replica_parallel 3.34s call test_restore_replica/test.py::test_restore_replica_alive_replicas 3.14s teardown test_format_schema_on_server/test.py::test_protobuf_format_output 3.09s teardown test_filesystem_layout/test.py::test_file_path_escaping 3.05s call test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer 2.85s teardown test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections 2.76s teardown test_http_and_readonly/test.py::test_http_get_is_readonly 2.67s teardown test_relative_filepath/test.py::test_filepath 2.59s teardown test_inherit_multiple_profiles/test.py::test_combined_profile 2.49s teardown test_send_crash_reports/test.py::test_send_segfault 2.37s call test_send_crash_reports/test.py::test_send_segfault 2.11s call test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections 2.04s call test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation 2.03s teardown test_keeper_znode_time/test.py::test_server_restart 1.80s teardown test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 1.80s teardown test_keeper_nodes_remove/test.py::test_nodes_remove 1.73s teardown test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load 1.59s teardown test_server_initialization/test.py::test_sophisticated_default 1.55s teardown 
test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success 0.92s call test_filesystem_layout/test.py::test_file_path_escaping 0.43s call test_inherit_multiple_profiles/test.py::test_combined_profile 0.43s call test_graphite_merge_tree_typed/test.py::test_combined_rules 0.37s teardown test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged 0.37s teardown test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] 0.31s call test_format_schema_on_server/test.py::test_protobuf_format_input 0.30s call test_relative_filepath/test.py::test_filepath 0.28s call test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain 0.28s call test_graphite_merge_tree_typed/test.py::test_rollup_versions_all 0.28s call test_kerberos_auth/test.py::test_bad_path_to_keytab 0.27s call test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged 0.26s call test_merge_tree_s3_failover/test.py::test_throttle_retry 0.26s call test_s3_storage_class/test.py::test_s3_storage_class_right 0.25s call test_kerberos_auth/test.py::test_kerberos_auth_without_keytab 0.25s call test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks 0.24s call test_server_initialization/test.py::test_partially_dropped_tables 0.21s call test_graphite_merge_tree_typed/test.py::test_rules_isolation 0.20s call test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions 0.19s call test_storage_kerberized_hdfs/test.py::test_cache_path 0.18s call test_format_schema_on_server/test.py::test_protobuf_format_output 0.17s call test_restore_replica/test.py::test_restore_replica_invalid_tables 0.17s call test_kerberos_auth/test.py::test_kerberos_auth_with_keytab 0.16s call test_http_and_readonly/test.py::test_http_get_is_readonly 0.14s call test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged 0.14s call test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain 0.14s call test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern 0.14s call test_replica_can_become_leader/test.py::test_can_become_leader 0.12s call test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain 0.12s setup test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks 0.12s setup test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain 0.11s call test_server_initialization/test.py::test_sophisticated_default 0.07s setup test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer 0.07s teardown test_merge_tree_s3_failover/test.py::test_move_failover 0.07s setup test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern 0.07s teardown test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain 0.07s setup test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions 0.07s setup test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged 0.07s setup test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged 0.07s call test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain 0.07s setup test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain 0.07s setup test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain 0.07s call 
test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged 0.07s setup test_graphite_merge_tree_typed/test.py::test_rollup_versions_all 0.07s teardown test_merge_tree_s3_failover/test.py::test_throttle_retry 0.07s setup test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged 0.07s teardown test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain 0.07s setup test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged 0.07s teardown test_graphite_merge_tree_typed/test.py::test_combined_rules 0.07s teardown test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged 0.07s teardown test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rollup_versions_all 0.07s setup test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged 0.07s call test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged 0.07s setup test_graphite_merge_tree_typed/test.py::test_rules_isolation 0.07s teardown test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer 0.07s teardown test_graphite_merge_tree_typed/test.py::test_rules_isolation 0.01s teardown test_keeper_znode_time/test.py::test_between_servers 0.00s teardown test_restore_replica/test.py::test_restore_replica_parallel 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] 0.00s teardown test_restore_replica/test.py::test_restore_replica_alive_replicas 0.00s teardown test_reload_clusters_config/test.py::test_add_cluster 0.00s setup test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] 0.00s teardown test_storage_hdfs/test.py::test_write_table 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] 0.00s setup test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] 0.00s teardown test_kerberos_auth/test.py::test_bad_path_to_keytab 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] 0.00s setup test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 0.00s setup test_restore_replica/test.py::test_restore_replica_sequential 0.00s setup test_restore_replica/test.py::test_restore_replica_parallel 0.00s setup test_merge_tree_s3_failover/test.py::test_throttle_retry 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] 0.00s teardown test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] 0.00s teardown test_server_initialization/test.py::test_live_view_dependency 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge 0.00s teardown test_format_schema_on_server/test.py::test_protobuf_format_input 0.00s setup 
test_server_initialization/test.py::test_sophisticated_default 0.00s setup test_server_initialization/test.py::test_partially_dropped_tables 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] 0.00s setup test_kerberos_auth/test.py::test_kerberos_auth_without_keytab 0.00s setup test_format_schema_on_server/test.py::test_protobuf_format_output 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge 0.00s setup test_reload_clusters_config/test.py::test_delete_cluster 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] 0.00s setup test_keeper_znode_time/test.py::test_server_restart 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation 0.00s setup test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards 0.00s setup test_storage_hdfs/test.py::test_globs_in_read_table 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter 0.00s teardown test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards 0.00s setup test_kerberos_auth/test.py::test_kerberos_auth_with_keytab 0.00s teardown test_storage_hdfs/test.py::test_schema_inference_with_globs 0.00s teardown test_storage_hdfs/test.py::test_bad_hdfs_uri 0.00s setup test_reload_clusters_config/test.py::test_simple_reload 0.00s teardown test_storage_hdfs/test.py::test_virtual_columns_2 0.00s teardown test_storage_hdfs/test.py::test_write_gz_storage 0.00s setup test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards 0.00s setup test_storage_hdfs/test.py::test_multiple_inserts 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter 0.00s teardown test_storage_hdfs/test.py::test_hdfsCluster 0.00s teardown test_storage_hdfs/test.py::test_hdfs_directory_not_exist 0.00s teardown test_storage_hdfs/test.py::test_read_write_gzip_table 0.00s teardown test_storage_hdfs/test.py::test_read_write_table 0.00s teardown test_storage_hdfs/test.py::test_cluster_join 0.00s teardown test_storage_hdfs/test.py::test_format_detection 0.00s teardown test_storage_hdfs/test.py::test_multiple_inserts 0.00s teardown test_storage_hdfs/test.py::test_virtual_columns 0.00s teardown test_storage_hdfs/test.py::test_truncate_table 0.00s teardown test_storage_hdfs/test.py::test_seekable_formats 0.00s teardown test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system 0.00s teardown test_storage_hdfs/test.py::test_write_gzip_storage 0.00s teardown test_storage_hdfs/test.py::test_globs_in_read_table 0.00s teardown test_storage_hdfs/test.py::test_cluster_macro 0.00s setup test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip 0.00s teardown test_storage_hdfs/test.py::test_read_write_table_with_parameter_none 0.00s teardown test_storage_hdfs/test.py::test_read_files_with_spaces 0.00s teardown test_storage_hdfs/test.py::test_insert_select_schema_inference 0.00s teardown test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz 
0.00s teardown test_storage_hdfs/test.py::test_read_write_storage_with_globs 0.00s setup test_storage_hdfs/test.py::test_write_gzip_storage 0.00s teardown test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip 0.00s setup test_storage_hdfs/test.py::test_insert_select_schema_inference 0.00s teardown test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards 0.00s teardown test_reload_clusters_config/test.py::test_simple_reload 0.00s setup test_reload_clusters_config/test.py::test_update_one_cluster 0.00s teardown test_storage_hdfs/test.py::test_partition_by 0.00s teardown test_storage_hdfs/test.py::test_overwrite 0.00s teardown test_reload_clusters_config/test.py::test_delete_cluster 0.00s teardown test_storage_hdfs/test.py::test_read_table_with_default 0.00s teardown test_storage_hdfs/test.py::test_read_write_storage 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system 0.00s teardown test_storage_hdfs/test.py::test_schema_inference_cache 0.00s teardown test_storage_hdfs/test.py::test_schema_inference 0.00s teardown test_kerberos_auth/test.py::test_kerberos_auth_with_keytab 0.00s teardown test_restore_replica/test.py::test_restore_replica_invalid_tables 0.00s setup test_storage_hdfs/test.py::test_read_files_with_spaces 0.00s setup test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz 0.00s setup test_storage_hdfs/test.py::test_cluster_macro 0.00s setup test_restore_replica/test.py::test_restore_replica_invalid_tables 0.00s setup test_storage_hdfs/test.py::test_overwrite 0.00s setup test_storage_hdfs/test.py::test_schema_inference_with_globs 0.00s setup test_storage_hdfs/test.py::test_write_table 0.00s setup test_storage_hdfs/test.py::test_cluster_join 0.00s setup test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system 0.00s setup test_storage_hdfs/test.py::test_read_write_gzip_table 0.00s setup test_storage_hdfs/test.py::test_read_write_storage 0.00s setup test_storage_hdfs/test.py::test_read_write_table_with_parameter_none 0.00s setup test_storage_hdfs/test.py::test_truncate_table 0.00s setup test_storage_hdfs/test.py::test_virtual_columns 0.00s setup test_storage_hdfs/test.py::test_hdfsCluster 0.00s setup test_storage_hdfs/test.py::test_virtual_columns_2 0.00s setup test_storage_hdfs/test.py::test_partition_by 0.00s setup test_storage_hdfs/test.py::test_seekable_formats 0.00s setup test_storage_hdfs/test.py::test_read_write_storage_with_globs 0.00s setup test_storage_hdfs/test.py::test_read_table_with_default 0.00s setup test_storage_hdfs/test.py::test_read_write_table 0.00s setup test_storage_hdfs/test.py::test_write_gz_storage 0.00s setup test_storage_hdfs/test.py::test_format_detection 0.00s setup test_storage_hdfs/test.py::test_schema_inference 0.00s setup test_storage_hdfs/test.py::test_schema_inference_cache 0.00s setup test_storage_hdfs/test.py::test_hdfs_directory_not_exist 0.00s teardown test_server_initialization/test.py::test_partially_dropped_tables =========================== short test summary info ============================ ERROR test_storage_hdfs/test.py::test_bad_hdfs_uri - Exception: Command ['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/.env', '--project-name', 'rootteststoragehdfs', '--file', '/ClickHouse/tests/integration/test_storage_hdfs/_instances_0/node1/docker-compose.yml', '--file', '/compose/docker_compose_hdfs.yml', 'pull'] return non-zero code 1: Pulling node1 ... 
(each of the 31 setup errors below reports the same message as test_bad_hdfs_uri above: Exception: Command ['docker-compose', ..., 'pull'] return non-zero code 1)
ERROR test_storage_hdfs/test.py::test_cluster_join
ERROR test_storage_hdfs/test.py::test_cluster_macro
ERROR test_storage_hdfs/test.py::test_format_detection
ERROR test_storage_hdfs/test.py::test_globs_in_read_table
ERROR test_storage_hdfs/test.py::test_hdfsCluster
ERROR test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards
ERROR test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards
ERROR test_storage_hdfs/test.py::test_hdfs_directory_not_exist
ERROR test_storage_hdfs/test.py::test_insert_select_schema_inference
ERROR test_storage_hdfs/test.py::test_multiple_inserts
ERROR test_storage_hdfs/test.py::test_overwrite
ERROR test_storage_hdfs/test.py::test_partition_by
ERROR test_storage_hdfs/test.py::test_read_files_with_spaces
ERROR test_storage_hdfs/test.py::test_read_table_with_default
ERROR test_storage_hdfs/test.py::test_read_write_gzip_table
ERROR test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz
ERROR test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip
ERROR test_storage_hdfs/test.py::test_read_write_storage
ERROR test_storage_hdfs/test.py::test_read_write_storage_with_globs
ERROR test_storage_hdfs/test.py::test_read_write_table
ERROR test_storage_hdfs/test.py::test_read_write_table_with_parameter_none
ERROR test_storage_hdfs/test.py::test_schema_inference
ERROR test_storage_hdfs/test.py::test_schema_inference_cache
ERROR test_storage_hdfs/test.py::test_schema_inference_with_globs
ERROR test_storage_hdfs/test.py::test_seekable_formats
ERROR test_storage_hdfs/test.py::test_truncate_table
ERROR test_storage_hdfs/test.py::test_virtual_columns
ERROR test_storage_hdfs/test.py::test_virtual_columns_2
ERROR test_storage_hdfs/test.py::test_write_gz_storage
ERROR test_storage_hdfs/test.py::test_write_gzip_storage
ERROR test_storage_hdfs/test.py::test_write_table
PASSED test_format_schema_on_server/test.py::test_protobuf_format_input
PASSED test_format_schema_on_server/test.py::test_protobuf_format_output
PASSED test_graphite_merge_tree_typed/test.py::test_combined_rules
PASSED test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks
PASSED test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain
PASSED test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged
PASSED test_kerberos_auth/test.py::test_bad_path_to_keytab
PASSED test_kerberos_auth/test.py::test_kerberos_auth_with_keytab
PASSED test_kerberos_auth/test.py::test_kerberos_auth_without_keytab
PASSED test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer
PASSED test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_versions_all
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain
PASSED test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged
PASSED test_graphite_merge_tree_typed/test.py::test_rules_isolation
PASSED test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions
PASSED test_keeper_znode_time/test.py::test_between_servers
PASSED test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4
PASSED test_restore_replica/test.py::test_restore_replica_alive_replicas
PASSED test_restore_replica/test.py::test_restore_replica_invalid_tables
PASSED test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6
PASSED test_keeper_znode_time/test.py::test_server_restart
PASSED test_server_initialization/test.py::test_live_view_dependency
PASSED test_server_initialization/test.py::test_partially_dropped_tables
PASSED test_server_initialization/test.py::test_sophisticated_default
PASSED test_restore_replica/test.py::test_restore_replica_parallel
PASSED test_merge_tree_s3_failover/test.py::test_move_failover
PASSED test_merge_tree_s3_failover/test.py::test_throttle_retry
PASSED test_inherit_multiple_profiles/test.py::test_combined_profile
PASSED test_restore_replica/test.py::test_restore_replica_sequential
PASSED test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success
PASSED test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation
PASSED test_filesystem_layout/test.py::test_file_path_escaping
PASSED test_replica_can_become_leader/test.py::test_can_become_leader
PASSED test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2]
PASSED test_reload_clusters_config/test.py::test_add_cluster
PASSED test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent
PASSED test_http_and_readonly/test.py::test_http_get_is_readonly
PASSED test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0]
PASSED test_relative_filepath/test.py::test_filepath
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3]
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge
PASSED test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load
PASSED test_reload_clusters_config/test.py::test_delete_cluster
PASSED test_s3_storage_class/test.py::test_s3_storage_class_right
PASSED test_keeper_nodes_remove/test.py::test_nodes_remove
PASSED test_replica_is_active/test.py::test_replica_is_active
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter
PASSED test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections
PASSED test_reload_clusters_config/test.py::test_simple_reload
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system
PASSED test_secure_socket/test.py::test
PASSED test_send_crash_reports/test.py::test_send_segfault
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation
PASSED test_reload_clusters_config/test.py::test_update_one_cluster
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter
PASSED test_storage_kerberized_hdfs/test.py::test_cache_path
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10]
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3]
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10]
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3]
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10]
PASSED test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3]
============= 68 passed, 1 warning, 32 errors in 291.97s (0:04:51) =============
Traceback (most recent call last):
  File "/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration/./runner", line 448, in <module>
    subprocess.check_call(cmd, shell=True)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_4pasye --privileged --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/src/Server/grpc_protos:/ClickHouse/src/Server/grpc_protos --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e XTABLES_LOCKFILE=/run/host/xtables.lock -e PYTHONUNBUFFERED=1 -e DOCKER_DOTNET_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_HELPER_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_BASE_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBERIZED_HADOOP_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBEROS_KDC_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JS_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_PHP_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=0 --color=no --durations=0 test_fetch_partition_should_reset_mutation/test.py::test_part_should_reset_mutation test_filesystem_layout/test.py::test_file_path_escaping test_format_schema_on_server/test.py::test_protobuf_format_input test_format_schema_on_server/test.py::test_protobuf_format_output test_graphite_merge_tree_typed/test.py::test_combined_rules test_graphite_merge_tree_typed/test.py::test_multiple_output_blocks test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_plain test_graphite_merge_tree_typed/test.py::test_multiple_paths_and_versions_tagged test_graphite_merge_tree_typed/test.py::test_path_dangling_pointer test_graphite_merge_tree_typed/test.py::test_paths_not_matching_any_pattern test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_plain test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_2_tagged test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_plain test_graphite_merge_tree_typed/test.py::test_rollup_aggregation_tagged test_graphite_merge_tree_typed/test.py::test_rollup_versions_all test_graphite_merge_tree_typed/test.py::test_rollup_versions_plain test_graphite_merge_tree_typed/test.py::test_rollup_versions_tagged test_graphite_merge_tree_typed/test.py::test_rules_isolation test_graphite_merge_tree_typed/test.py::test_system_graphite_retentions test_http_and_readonly/test.py::test_http_get_is_readonly test_inherit_multiple_profiles/test.py::test_combined_profile test_insert_distributed_async_extra_dirs/test.py::test_insert_distributed_async_send_success test_keeper_nodes_remove/test.py::test_nodes_remove test_keeper_snapshot_small_distance/test.py::test_snapshot_and_load test_keeper_znode_time/test.py::test_between_servers test_keeper_znode_time/test.py::test_server_restart test_kerberos_auth/test.py::test_bad_path_to_keytab test_kerberos_auth/test.py::test_kerberos_auth_with_keytab 
test_kerberos_auth/test.py::test_kerberos_auth_without_keytab test_merge_tree_s3_failover/test.py::test_move_failover test_merge_tree_s3_failover/test.py::test_throttle_retry test_merge_tree_s3_failover/test.py::test_write_failover[0-13-2] test_merge_tree_s3_failover/test.py::test_write_failover[1048576-9-0] test_mutations_with_projection/test.py::test_mutations_with_multi_level_merge_of_projections test_odbc_interaction/test_exiled.py::test_bridge_dies_with_parent test_relative_filepath/test.py::test_filepath test_reload_clusters_config/test.py::test_add_cluster test_reload_clusters_config/test.py::test_delete_cluster test_reload_clusters_config/test.py::test_simple_reload test_reload_clusters_config/test.py::test_update_one_cluster test_replica_can_become_leader/test.py::test_can_become_leader test_replica_is_active/test.py::test_replica_is_active test_restore_replica/test.py::test_restore_replica_alive_replicas test_restore_replica/test.py::test_restore_replica_invalid_tables test_restore_replica/test.py::test_restore_replica_parallel test_restore_replica/test.py::test_restore_replica_sequential test_s3_storage_class/test.py::test_s3_storage_class_right test_s3_zero_copy_replication/test.py::test_s3_zero_copy_concurrent_merge test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_alter test_s3_zero_copy_replication/test.py::test_s3_zero_copy_drop_detached_system test_s3_zero_copy_replication/test.py::test_s3_zero_copy_keeps_data_after_mutation test_s3_zero_copy_replication/test.py::test_s3_zero_copy_replication[s3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_alter test_s3_zero_copy_replication/test.py::test_s3_zero_copy_unfreeze_system test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_delete[True-3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered-True-3] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-False-10] test_s3_zero_copy_replication/test.py::test_s3_zero_copy_with_ttl_move[tiered_copy-True-3] test_secure_socket/test.py::test test_send_crash_reports/test.py::test_send_segfault test_server_initialization/test.py::test_live_view_dependency test_server_initialization/test.py::test_partially_dropped_tables test_server_initialization/test.py::test_sophisticated_default test_server_start_and_ip_conversions/test.py::test_restart_success_ipv4 test_server_start_and_ip_conversions/test.py::test_restart_success_ipv6 test_storage_hdfs/test.py::test_bad_hdfs_uri test_storage_hdfs/test.py::test_cluster_join test_storage_hdfs/test.py::test_cluster_macro test_storage_hdfs/test.py::test_format_detection test_storage_hdfs/test.py::test_globs_in_read_table test_storage_hdfs/test.py::test_hdfsCluster test_storage_hdfs/test.py::test_hdfsCluster_skip_unavailable_shards test_storage_hdfs/test.py::test_hdfsCluster_unskip_unavailable_shards test_storage_hdfs/test.py::test_hdfs_directory_not_exist test_storage_hdfs/test.py::test_insert_select_schema_inference test_storage_hdfs/test.py::test_multiple_inserts test_storage_hdfs/test.py::test_overwrite test_storage_hdfs/test.py::test_partition_by test_storage_hdfs/test.py::test_read_files_with_spaces test_storage_hdfs/test.py::test_read_table_with_default test_storage_hdfs/test.py::test_read_write_gzip_table 
test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_auto_gz test_storage_hdfs/test.py::test_read_write_gzip_table_with_parameter_gzip test_storage_hdfs/test.py::test_read_write_storage test_storage_hdfs/test.py::test_read_write_storage_with_globs test_storage_hdfs/test.py::test_read_write_table test_storage_hdfs/test.py::test_read_write_table_with_parameter_none test_storage_hdfs/test.py::test_schema_inference test_storage_hdfs/test.py::test_schema_inference_cache test_storage_hdfs/test.py::test_schema_inference_with_globs test_storage_hdfs/test.py::test_seekable_formats test_storage_hdfs/test.py::test_truncate_table test_storage_hdfs/test.py::test_virtual_columns test_storage_hdfs/test.py::test_virtual_columns_2 test_storage_hdfs/test.py::test_write_gz_storage test_storage_hdfs/test.py::test_write_gzip_storage test_storage_hdfs/test.py::test_write_table test_storage_kerberized_hdfs/test.py::test_cache_path -vvv' altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d ' returned non-zero exit status 1.